Asymptotic Normality of Nonparametric Kernel Type Deconvolution Density Estimators: crossing the Cauchy boundary
Abstract
We derive asymptotic normality of kernel-type deconvolution density estimators. In particular, we consider deconvolution problems where the known component of the convolution has a symmetric λ-stable distribution, 0 < λ ≤ 2. It turns out that the limit behavior changes as the exponent λ crosses the value one, the case of Cauchy deconvolution. AMS classification: primary 62G05; secondary 62E20
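To make the construction concrete, the following is a minimal numerical sketch of a kernel-type deconvolution density estimator under the model assumed here: Y = X + ε with ε symmetric λ-stable, characteristic function exp(−|γt|^λ), and the sinc kernel, whose Fourier transform is the indicator of [−1, 1]. The function name, the scale γ and all tuning values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def deconvolution_kde(x_grid, y_obs, h, lam=1.0, gamma=1.0, n_t=801):
    """Kernel-type deconvolution density estimate evaluated on x_grid.

    Assumed model: Y = X + eps, eps symmetric lambda-stable with
    characteristic function exp(-|gamma * t| ** lam).  The kernel is the
    sinc kernel, whose Fourier transform equals 1 on [-1, 1], so the
    inversion integral runs over |t| <= 1/h only.
    """
    t = np.linspace(-1.0 / h, 1.0 / h, n_t)                    # Fourier grid
    emp_cf = np.mean(np.exp(1j * np.outer(t, y_obs)), axis=1)  # empirical char. function of Y
    err_cf = np.exp(-np.abs(gamma * t) ** lam)                 # char. function of the stable error
    integrand = emp_cf / err_cf                                # divide out the known error part
    return np.array([                                          # inverse Fourier transform, pointwise
        np.trapz(np.exp(-1j * t * x) * integrand, t).real / (2.0 * np.pi)
        for x in np.atleast_1d(x_grid)
    ])

# Toy usage: standard normal X observed with Cauchy noise (lambda = 1).
rng = np.random.default_rng(0)
y = rng.normal(size=2000) + 0.3 * rng.standard_cauchy(size=2000)
fhat = deconvolution_kde(np.linspace(-4.0, 4.0, 81), y, h=0.4, lam=1.0, gamma=0.3)
```

Roughly speaking, the division by the stable characteristic function over the range |t| ≤ 1/h is what makes the variance of the estimator, and hence its limit behavior, depend on the exponent λ.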
Similar articles
Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data
Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
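For comparison, here is a minimal sketch of an ordinary (fully observed, untruncated) k-nearest-neighbor kernel density estimator; it is not the left-truncation estimator of the paper, and the function name, Gaussian kernel and default k are illustrative assumptions.

```python
import numpy as np

def knn_kernel_density(x_grid, sample, k=20):
    """k-nearest-neighbor kernel density estimate: the bandwidth at each
    evaluation point x is the distance to its k-th nearest observation,
    and a Gaussian kernel is rescaled by that local bandwidth.
    (Sketch for complete data only, without a truncation correction.)"""
    sample = np.asarray(sample, dtype=float)
    x_grid = np.atleast_1d(np.asarray(x_grid, dtype=float))
    est = np.empty_like(x_grid)
    for i, x in enumerate(x_grid):
        r_k = np.sort(np.abs(sample - x))[k - 1]   # local bandwidth: k-th nearest-neighbor distance
        u = (x - sample) / r_k
        est[i] = np.mean(np.exp(-0.5 * u ** 2)) / (np.sqrt(2.0 * np.pi) * r_k)
    return est
```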
Combining kernel estimators in the uniform deconvolution problem
We construct a density estimator and an estimator of the distribution function in the uniform deconvolution model. The estimators are based on inversion formulas and kernel estimators of the density of the observations and its derivative. Asymptotic normality and the asymptotic biases are derived. AMS classification: primary 62G05; secondary 62E20, 62G07, 62G20
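For orientation, one standard form of the inversion behind such estimators can be written down directly (a sketch, assuming Y = X + U with U uniform on [0, 1] independent of X, X nonnegative, g the density of the observed Y, and F, f the distribution function and density of X):

```latex
g(y) = \int_0^1 f(y - u)\,du = F(y) - F(y - 1),
\qquad
F(x) = \sum_{j \ge 0} g(x - j),
\qquad
f(x) = \sum_{j \ge 0} g'(x - j).
```

Plugging kernel estimators of g and g' into the last two sums gives distribution function and density estimators of the type described above.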
Gamma Kernel Estimators for Density and Hazard Rate of Right-Censored Data
Nonparametric estimation of the density and hazard rate functions for right-censored data using kernel smoothing techniques is considered. The “classical” fixed symmetric kernel-type estimator of these functions performs well in the interior region but suffers from bias in the boundary region. Here, we propose new estimators based on the gamma kernels for the density...
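A minimal sketch of a plain gamma-kernel density estimator (without the censoring correction that the cited paper adds) is given below; the function name, the Chen-type shape parameter x/b + 1 and the bandwidth are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_density(x_grid, sample, b=0.1):
    """Gamma-kernel density estimate for nonnegative data (uncensored
    sketch).  At each point x the kernel is the Gamma(x/b + 1, scale=b)
    density evaluated at the observations, which keeps all kernel mass
    on [0, inf) and so avoids the boundary bias of symmetric kernels."""
    sample = np.asarray(sample, dtype=float)
    return np.array([
        np.mean(gamma.pdf(sample, a=x / b + 1.0, scale=b))
        for x in np.atleast_1d(x_grid)
    ])
```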
Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)
Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for ...
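To illustrate the idea, here is a hedged sketch of a simple difference-based variance-function estimator; it uses first-order differences and a Gaussian kernel smoother, and is not the exact estimator class analyzed in the paper.

```python
import numpy as np

def difference_based_variance(x, y, x_grid, h=0.1):
    """Difference-sequence estimate of the variance function sigma^2(x).

    Squared first differences of adjacent responses have expectation
    close to 2 * sigma^2 (plus a small bias from the mean function), so
    half the squared differences serve as pseudo-observations of
    sigma^2 and are smoothed by Nadaraya-Watson kernel regression."""
    order = np.argsort(x)
    x_s, y_s = np.asarray(x, dtype=float)[order], np.asarray(y, dtype=float)[order]
    d2 = 0.5 * (y_s[1:] - y_s[:-1]) ** 2           # pseudo-observations of sigma^2
    mid = 0.5 * (x_s[1:] + x_s[:-1])               # where each difference "lives"
    est = []
    for x0 in np.atleast_1d(x_grid):
        w = np.exp(-0.5 * ((mid - x0) / h) ** 2)   # Gaussian kernel weights
        est.append(np.sum(w * d2) / np.sum(w))
    return np.array(est)
```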